Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available without a charge during the embargo (administrative interval).
Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.
- Free, publicly accessible full text available May 1, 2026
- Free, publicly accessible full text available January 1, 2026
- While augmented reality (AR) headsets provide entirely new ways of seeing and interacting with data, traditional computing devices can play a symbiotic role when used in conjunction with AR as a hybrid user interface. A promising use case for this setup is situated analytics. AR can provide embedded views that are integrated with their physical referents, and a separate device such as a tablet can provide a familiar situated overview of the entire dataset being examined. While prior work has explored similar setups, we sought to understand how people perceive and make use of visualizations presented both as embedded visualizations (in AR) and as situated visualizations (on a tablet) to achieve their own goals. To this end, we conducted an exploratory study using a scenario and task familiar to most: adjusting light levels in a smart home based on personal preference and energy usage. In a prototype that simulates AR in virtual reality, embedded visualizations are positioned next to lights distributed across an apartment, and situated visualizations are provided on a handheld tablet. We observed and interviewed 19 participants using the prototype. Participants were easily able to perform the task, though the extent to which the visualizations were used during the task varied, with some participants making decisions based on the data and others only on their own preferences. Our findings also suggest two distinct roles that situated and embedded visualizations can have, and how this clear separation might improve user satisfaction and minimize attention-switching overheads in this hybrid user interface setup. We conclude by discussing the importance of considering the user's needs, goals, and physical environment when designing and evaluating effective situated analytics applications.
- Immersive environments enable users to engage in embodied interaction, enhancing the sensemaking processes involved in completing tasks such as immersive analytics. Previous comparative studies on immersive analytics using augmented and virtual realities have revealed that users employ different strategies for data interpretation and text-based analytics depending on the environment. Our study investigates how augmented and virtual reality influence sensemaking processes in quantitative immersive analytics. Our results, derived from a diverse group of participants, indicate that users demonstrate comparable performance in both environments. However, users exhibited a higher tolerance for cognitive load in VR and traveled farther in AR. Based on our findings, we recommend giving users the option to switch between AR and VR, enabling them to select an environment that aligns with their preferences and task requirements.
- Virtual Reality (VR) applications like OpenBrush offer artists access to 3D sketching tools within the digital 3D virtual space. These 3D sketching tools allow users to “paint” using virtual digital strokes that emulate real-world mark-making. Yet, users paint these strokes through (unimodal) VR controllers. Given that sketching in VR is a relatively nascent field, this paper investigates ways to expand our understanding of sketching in virtual space, taking full advantage of what an immersive digital canvas offers. Through a study conducted with the participation of artists, we identify potential methods for natural multimodal and unimodal interaction techniques in 3D sketching. These methods demonstrate ways to incrementally improve existing interaction techniques and incorporate artistic feedback into the design.
- Immersive Analytics (IA) and consumer adoption of augmented reality (AR) and virtual reality (VR) head-mounted displays (HMDs) are both rapidly growing. When used in conjunction, stereoscopic IA environments can offer improved user understanding and engagement; however, it is unclear how the choice of stereoscopic display impacts user interactions within an IA environment. This paper presents a pilot study that examines the impact of stereoscopic display choice on object manipulation and environmental navigation using consumer-available AR and VR HMDs. Our observations indicate that the display can impact how users manipulate virtual content and how they navigate the environment.
- As we develop computing platforms for augmented reality (AR) head-mounted display (HMD) technologies for social or workplace environments, understanding how users interact with notifications in immersive environments has become crucial. We researched the effectiveness and user preferences of different interaction modalities for notifications, along with two types of notification display methods. In our study, participants were immersed in a simulated cooking environment using an AR-HMD, where they had to fulfill customer orders. During the cooking process, participants received notifications related to customer orders and ingredient updates. They were given three interaction modes for those notifications: voice commands, eye gaze and dwell, and hand gestures. To manage multiple notifications at once, we also researched two different notification list displays, one attached to the user’s hand and one in the world. Results indicate that participants preferred using their hands to interact with notifications and having the list of notifications attached to their hands. Voice and gaze interaction was perceived as having lower usability than touch.
- Annotation in 3D user interfaces such as Augmented Reality (AR) and Virtual Reality (VR) is a challenging and promising area; however, no surveys currently review these contributions. To provide a survey of annotations for Extended Reality (XR) environments, we conducted a structured literature review of papers that used annotation in their AR/VR systems, covering the period from 2001 to 2021. Our literature review process consists of several filtering steps, which resulted in 103 XR publications with a focus on annotation. We classified these papers based on display technologies, input devices, annotation types, target objects under annotation, collaboration types, modalities, and collaborative technologies. A survey of annotation in XR is an invaluable resource for researchers and newcomers. Finally, we provide a database of the collected information for each reviewed paper. This information includes the applications, display technologies and their annotators, input devices, modalities, annotation types, interaction techniques, collaboration types, and tasks for each paper. This database provides rapid access to the collected data and gives users the ability to search or filter the required information. This survey provides a starting point for anyone interested in researching annotation in XR environments.
An official website of the United States government